13 research outputs found

    Bacteria Hunt: A multimodal, multiparadigm BCI game

    Brain-Computer Interfaces (BCIs) allow users to control applications by brain activity. Among their possible applications for non-disabled people, games are promising candidates. BCIs can enrich game play with the mental and affective state information they carry. During the eNTERFACE’09 workshop we developed the Bacteria Hunt game, which can be played by keyboard and BCI, using SSVEP and relative alpha power. We conducted experiments to investigate how positive versus negative neurofeedback affects subjects’ relaxation state, and how well the different BCI paradigms can be used together. We observed no significant difference in mean alpha band power (and thus relaxation) or in user experience between the games applying positive and negative feedback. We also found that alpha power before SSVEP stimulation was significantly higher than alpha power during SSVEP stimulation, indicating that there is some interference between the two BCI paradigms.
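
    The relaxation measure mentioned above is relative alpha band power. The sketch below is illustrative only and is not the authors' implementation; the sampling rate, band edges and window length are assumptions:

```python
# Minimal sketch, not the authors' implementation: estimating relative alpha
# band power from a single EEG channel, the kind of measure that drives the
# relaxation mechanic in the game. Sampling rate, band edges and window
# length are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def relative_alpha_power(eeg, fs, alpha_band=(8.0, 12.0), total_band=(4.0, 30.0)):
    """Return alpha power as a fraction of broadband EEG power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2-second Welch windows
    alpha = (freqs >= alpha_band[0]) & (freqs <= alpha_band[1])
    total = (freqs >= total_band[0]) & (freqs <= total_band[1])
    return np.trapz(psd[alpha], freqs[alpha]) / np.trapz(psd[total], freqs[total])

# Example on ten seconds of synthetic data at 256 Hz:
fs = 256
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz "alpha" plus noise
print(relative_alpha_power(eeg, fs))
```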

    Brain-Computer Interfaces based on multisensory Event-Related Potentials

    This dissertation is organized along the four research questions in the form of scientific papers. Here we give an overview of each chapter, its motivation, and the relations between the chapters.
    Chapter 2: Controlling a tactile ERP-BCI in a dual-task. In this chapter we evaluate the mental-resource costs of controlling a tactile ERP-BCI while simultaneously performing a concurrent task that uses visual information. This is the first step towards applying a tactile ERP-BCI for navigation. Tasks like (serious) gaming require cognitive resources, but attending to stimuli while operating an ERP-BCI also demands such resources. We investigate whether these two tasks can be performed simultaneously, and what the effects on brain signals (and subsequently BCI performance) and task performance are.
    Chapter 3: Does bimodal stimulus presentation increase ERP components usable in BCIs? In this chapter we explore the idea of increasing ERP activity by means of bimodal (visual-tactile) stimulus presentation, with the goal of enhancing BCI performance. Bimodal stimuli could evoke additional brain activity due to multisensory integration, which may be of use in a BCI. We investigate the effects of attending to bimodal visual-tactile (compared to unimodal) stimuli on the ERP. To this end we use stimulus pairs of tactile stimuli around the waist and visual stimuli embedded in a navigation environment presented on a display, corresponding in navigation direction.
    Chapter 4: Bimodal location-congruent ERP-BCIs: Increasing gaze-independent performance. In this chapter we further investigate bimodal (visual-tactile) ERP-BCIs and the role of location-congruency of the bimodal stimulus. Research has shown that bimodal stimuli do not have to be location-congruent for positive bimodal effects on task performance and brain activity to occur, yet location-congruent bimodal stimuli may (further) improve task performance and ERP components. Whereas in chapter 3 we use a gaze-dependent setup, as a first step and to compare results to traditional BCIs, in chapter 4 we take the next step by using a gaze-independent setup. In the latter case the potential benefits of bimodal stimuli are expected to be greater, as gaze-independent BCI performance is typically relatively low. Additionally, we study the effect of selectively attending to a single modality in bimodal BCIs.
    Chapter 5: Control-display mapping in brain–computer interfaces. In this chapter we present our research on the effect of congruency in the relation between command options and stimuli in a BCI context. When using a tactile ERP-BCI for navigation, a mapping is required between navigation directions on a visual display and unambiguously corresponding tactile stimuli from a tactile control device: control-display mapping (CDM).
    Chapter 6: Discussion and conclusions. We discuss the results of the separate studies and integrate them to answer the main research question. Furthermore, we discuss the implications of our results, reflect on the usefulness of ERP-BCIs for direct control and for other purposes, and make recommendations for future research. We finish with some concluding remarks.

    BCIs in multimodal interaction and multitask environments : Theoretical issues and initial guidelines

    The development of Brain-Computer Interfaces (BCIs) is entering a phase in which these devices are no longer restricted to applications in controlled, single-task environments. For instance, BCIs for gaming or high-end operator stations will function as part of a multimodal user interface in a multitask environment. This phase introduces new issues that were not relevant for the initial special-use applications, and these issues should be addressed systematically. In this paper, we present the potential conflicts and show how models of information processing can help to cope with them. We conclude by providing initial guidelines. © 2011 Springer-Verlag

    Navigation with a passive brain based interface

    In this paper, we describe a Brain-Computer Interface (BCI) for navigation. The system is based on detecting brain signals that are elicited by tactile stimulation on the torso indicating the desired direction.

    EEG-Based Navigation from a Human Factors Perspective

    Overview of brain-computer interfaces for navigation, organised in a framework based on the type of BCI and the navigation subtask.

    Does bimodal stimulus presentation increase ERP components usable in BCIs?

    Event-related potential (ERP)-based brain–computer interfaces (BCIs) employ differences in brain responses to attended and ignored stimuli. Typically, visual stimuli are used. Tactile stimuli have recently been suggested as a gaze-independent alternative. Bimodal stimuli could evoke additional brain activity due to multisensory integration, which may be of use in BCIs. We investigated the effect of visual–tactile stimulus presentation on the chain of ERP components, BCI performance (classification accuracies and bitrates) and participants' task performance (counting of targets). Ten participants were instructed to navigate a visual display by attending (spatially) to targets in sequences of either visual, tactile or visual–tactile stimuli. We observe that attending to visual–tactile (compared to either visual or tactile) stimuli results in an enhanced early ERP component (N1). This bimodal N1 may enhance BCI performance, as suggested by a nonsignificant positive trend in offline classification accuracies. A late ERP component (P300) is reduced when attending to visual–tactile compared to visual stimuli, which is consistent with the nonsignificant negative trend in participants' task performance. We discuss these findings in the light of affected spatial attention at high-level compared to low-level stimulus processing. Furthermore, we evaluate bimodal BCIs from a practical perspective and for future application.
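
    For context, the classification accuracies reported above come from offline classification of single-trial ERP epochs into attended (target) versus ignored (non-target) stimuli. The sketch below is illustrative only and is not the study's analysis pipeline; the epoch dimensions, the simulated P300-like deflection and the choice of a shrinkage-LDA classifier are assumptions:

```python
# Illustrative sketch only, not the study's analysis pipeline: offline
# classification of single-trial ERP epochs into attended (target) versus
# ignored (non-target) stimuli. Epoch dimensions, the simulated P300-like
# deflection and the shrinkage-LDA classifier are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated epochs: (n_trials, n_channels, n_samples) around each stimulus onset.
n_trials, n_channels, n_samples = 200, 8, 128
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)        # 1 = attended (target), 0 = ignored
epochs[labels == 1, :, 60:80] += 0.5         # crude stand-in for a P300-like deflection

# Flatten channels x time into a feature vector per trial and cross-validate.
X = epochs.reshape(n_trials, -1)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
accuracy = cross_val_score(clf, X, labels, cv=5).mean()
print(f"offline cross-validated accuracy: {accuracy:.2f}")
```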

    Exploring the use of feedback in ERP-based BCI

    Giving direct feedback on a mental state is common practice in motor imagery based brain-computer interfaces (BCIs), but has not been reported for those based on event-related potentials (ERPs). Potentially, such feedback could allow the user to adjust his strategy during a running trial to obtain the required response. In order to test the usefulness of such feedback, spatially congruent vibrotactile feedback was given during an online auditory BCI experiment. Users received either no feedback, short feedback pulses or continuous feedback. Results showed that feedback had a deteriorating effect, both on classification performance and on behavioural scores (target stimulus counting). Furthermore, most subjects preferred the no-feedback condition over those with feedback. Under these conditions, the use of direct feedback in ERP-based BCIs thus cannot be recommended.

    Bacteria Hunt: Evaluating multi-paradigm BCI interaction

    The multimodal, multi-paradigm brain-computer interfacing (BCI) game Bacteria Hunt was used to evaluate two aspects of BCI interaction in a gaming context. One goal was to examine the effect of feedback on the ability of the user to manipulate his mental state of relaxation. This was done by having one condition in which the subject played the game with real feedback, and another with sham feedback. The feedback did not seem to affect the game experience (such as sense of control and tension) or the objective indicators of relaxation (alpha activity and heart rate). The results are discussed with regard to clinical neurofeedback studies. The second goal was to look into possible interactions between the two BCI paradigms used in the game: steady-state visually evoked potentials (SSVEP) as an indicator of concentration, and alpha activity as a measure of relaxation. SSVEP stimulation activates the cortex and can thus block the alpha rhythm. Despite this effect, subjects were able to keep their alpha power up, in compliance with the instructed relaxation task. In addition to the main goals, a new SSVEP detection algorithm was developed and evaluated. © 2010 The Author(s)
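
    The paper's new SSVEP detection algorithm is not reproduced here. As a point of reference only, a common baseline for SSVEP detection is canonical correlation analysis (CCA) against sinusoidal reference signals at the candidate flicker frequencies; the frequencies, window length and harmonic count below are assumptions:

```python
# A common SSVEP detection baseline, shown for reference only; this is not the
# new detection algorithm developed in the paper. Canonical correlation analysis
# (CCA) scores the EEG segment against sinusoidal references at each candidate
# flicker frequency. Frequencies, window length and harmonic count are assumptions.
import numpy as np
from sklearn.cross_decomposition import CCA

def detect_ssvep_frequency(eeg, fs, candidate_freqs, n_harmonics=2):
    """eeg: array of shape (n_samples, n_channels). Returns the candidate
    frequency whose sinusoidal references correlate best with the EEG."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in candidate_freqs:
        refs = []
        for h in range(1, n_harmonics + 1):
            refs.append(np.sin(2 * np.pi * h * f * t))
            refs.append(np.cos(2 * np.pi * h * f * t))
        reference = np.column_stack(refs)
        u, v = CCA(n_components=1).fit_transform(eeg, reference)
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return candidate_freqs[int(np.argmax(scores))]

# Example: a 2-second window at 256 Hz with a 15 Hz flicker response in 4 channels.
fs, freqs = 256, [12.0, 15.0]
t = np.arange(0, 2, 1 / fs)
eeg = np.column_stack([np.sin(2 * np.pi * 15.0 * t) + np.random.randn(t.size)
                       for _ in range(4)])
print(detect_ssvep_frequency(eeg, fs, freqs))  # expected: 15.0
```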

    B cell targets in rheumatoid arthritis
